The real estate industry moves fast. It wasn’t so long ago that potential buyers narrowed their searches by driving around with a sheaf of printed listings, and designers and builders relied on CAD drawings and artist renderings to show yet-to-be-built spaces. Nowadays, advances in graphics technology have brought us interactive 3D renderings, making it easier for investors, buyers, and other stakeholders to truly understand the designs they’re looking at.
Building on these advances, we’re also seeing technology designed to excite and entice buyers: sales configurators and interactive tours, where visitors can choose finishes and design their own spaces right before their eyes; virtual reality experiences, where architects can get feedback from investors at key stages of the design process; shadow studies, where potential occupants can see how sunlight will affect a space at various times of day; and digital twins, where cities can get a true picture of a building’s usage, opening up new ways to improve efficiency and design better spaces for residents.
Many of these applications aren’t new—shadow and massing studies, for example, have been around for decades—but the difference is in how they’re delivered, and in their high level of interactivity. Imagine a combined configurator and shadow-study tool where users can, by tapping a few buttons, see how various finishes would look at different times of day. Such applications are not only possible, but can now be delivered to just about any device, from smartphones and tablets to PCs and kiosks, making them accessible to anyone with an internet connection.
We’re reaching a point in the proptech journey where, if you can build it virtually, anyone can experience and explore it visually, interactively, on just about any device.
What these new types of experiences have in common is real-time technology—the ability to render in real time, and to implement game-like controls such as unlimited movement through 3D spaces, on-screen buttons and sliders that cause instantaneous changes to materials and lighting, and the ability to read live data and update the scene accordingly. And behind real-time technology is a real-time engine—more specifically, a game engine.
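Conceptually, the real-time loop described above works like this: many times per second, the engine reads the latest control inputs and live data, updates the scene, and renders a new frame. The sketch below illustrates that idea in plain Python; it is not Unreal Engine code, and all names here (`Scene`, `run_frames`, the `finish` and `sun_angle` keys) are hypothetical.

```python
class Scene:
    """Hypothetical scene state: a material finish and a sun position."""

    def __init__(self):
        self.material = "oak"
        self.sun_angle = 0.0  # degrees above the horizon

    def apply(self, controls, live_data):
        # UI controls (e.g. a finish picker) change materials instantly
        self.material = controls.get("finish", self.material)
        # Live data (e.g. time of day) drives the lighting
        self.sun_angle = live_data.get("sun_angle", self.sun_angle)


def run_frames(scene, frames):
    """Simulate the per-frame loop: apply inputs, then render."""
    for controls, live_data in frames:
        scene.apply(controls, live_data)
        # A real engine would render the updated scene here, every frame
    return scene


# One simulated frame: the user picks walnut while the sun sits at 35°
scene = run_frames(Scene(), [({"finish": "walnut"}, {"sun_angle": 35.0})])
print(scene.material, scene.sun_angle)  # walnut 35.0
```

The point of the sketch is the ordering: inputs are re-read on every frame, so a slider drag or a fresh sensor reading shows up in the very next rendered image, which is what makes the experience feel instantaneous.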
How could a game engine be involved? Game engines have always included real-time rendering—as a player looks or moves around, the environment is rendered in real time. And while the graphics in video games of yesteryear were often blocky or clunky, these engines now include features like physically based materials and ray tracing, making them capable of a level of visual quality that rivals many offline renderers.
In fact, architectural design leaders Zaha Hadid Architects, Arup, and HOK are already embracing real-time technology, using it to develop new ways to communicate design to builders, gain approval from investors, and garner interest from buyers. Epic Games has taken a strong interest in these new applications because its game engine, Unreal Engine, is behind many of them.
To raise awareness of these new uses for real-time technology, Epic is hosting a real estate episode of The Pulse, a video series devoted to industries using game engines to entice, engage, and inform with these new types of experiences.
In The Pulse: Real-Time Real Estate: Visualize, Connect, Build on September 21, you can hear from forward-thinking companies using real-time technology to inspire interest, educate stakeholders, and sell real estate. Join host James Scott, Director of the Real Estate Technology Initiative at MIT Center for Real Estate, as he discusses the digital transformation within real estate with industry experts Edward Wagoner, CIO Digital at JLL Technologies; Dorian Vee, CEO/Founder at IMERZA; and Sam Anderson of Epic Games. At the live Q&A afterward, you can ask how to leverage real-time technology to achieve your company’s goals, and how to get started in this brave new world. Register today.